Unified approach to coefficient-based regularized regression

Authors

  • Yun-Long Feng
  • Shao-Gao Lv
Abstract

In this paper, we consider the coefficient-based regularized least-squares regression problem with the ℓq-regularizer (1 ≤ q ≤ 2) and data-dependent hypothesis spaces. Algorithms in data-dependent hypothesis spaces perform well thanks to their flexibility. We conduct a unified error analysis by a stepping-stone technique. An empirical covering number technique is also employed in our study to improve the sample error. Compared with existing results, we make a few improvements. First, we obtain a significantly sharper learning rate that can be arbitrarily close to O(m⁻¹) under reasonable conditions, which is regarded as the best learning rate in learning theory. Second, our results cover the case q = 1, which is novel. Finally, our results hold under very general conditions. © 2011 Elsevier Ltd. All rights reserved.
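As a rough illustration (not the authors' algorithm, which the paper analyzes rather than implements), the sketch below fits a coefficient-based ℓq-regularized least-squares model f(x) = Σᵢ αᵢ K(x, xᵢ) over the data-dependent hypothesis space spanned by the training samples, minimizing the empirical squared loss plus λ Σᵢ |αᵢ|^q. The Gaussian kernel, the Powell solver, and the parameter names (lam, sigma, q) are illustrative assumptions.

```python
# Minimal sketch of coefficient-based l_q-regularized least-squares regression:
# minimize (1/m) * sum_i (f(x_i) - y_i)^2 + lam * sum_i |alpha_i|^q
# over coefficients alpha of f(x) = sum_i alpha_i K(x, x_i).
import numpy as np
from scipy.optimize import minimize


def gaussian_kernel(X1, X2, sigma=1.0):
    # K[i, j] = exp(-||x1_i - x2_j||^2 / (2 * sigma^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def fit_coefficients(X, y, q=1.0, lam=0.1, sigma=1.0):
    # Empirical squared loss plus the l_q penalty on the coefficients.
    m = len(y)
    K = gaussian_kernel(X, X, sigma)

    def objective(alpha):
        residual = K @ alpha - y
        return residual @ residual / m + lam * np.sum(np.abs(alpha) ** q)

    # Powell is derivative-free, so the kink of |alpha_i|^q at zero (q = 1) is harmless.
    result = minimize(objective, np.zeros(m), method="Powell")
    return result.x


def predict(X_train, alpha, X_new, sigma=1.0):
    # Evaluate f(x) = sum_i alpha_i K(x, x_i) at the new inputs.
    return gaussian_kernel(X_new, X_train, sigma) @ alpha


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(40, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(40)
    alpha = fit_coefficients(X, y, q=1.0, lam=0.05)
    print("nonzero coefficients:", int(np.sum(np.abs(alpha) > 1e-3)))
    print("training error:", float(np.mean((predict(X, alpha, X) - y) ** 2)))
```

With q = 1 the penalty tends to drive many coefficients to zero, while q = 2 recovers the coefficient-based ridge-type kernel regression; the paper's error analysis covers the whole range 1 ≤ q ≤ 2 in a unified way.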

Similar resources

Regularized fuzzy clusterwise ridge regression

Fuzzy clusterwise regression has been a useful method for investigating cluster-level heterogeneity of observations based on linear regression. This method integrates fuzzy clustering and ordinary least-squares regression, thereby enabling the simultaneous estimation of regression coefficients for each cluster and of the fuzzy cluster memberships of observations. In practice, however, fuzzy clusterwise regression...

Network Intrusion Detection through Discriminative Feature Selection by Using Sparse Logistic Regression

An intrusion detection system (IDS) is a well-known and effective component of network security that provides network transactions with security and safety. Most earlier research has addressed difficulties such as overfitting, feature redundancy, high-dimensional features, and a limited number of training samples, but not feature selection. We approach the problem of feature selection...

Regularized Laplacian Estimation and Fast Eigenvector Approximation

Recently, Mahoney and Orecchia demonstrated that popular diffusion-based procedures to compute a quick approximation to the first nontrivial eigenvector of a data graph Laplacian exactly solve certain regularized Semi-Definite Programs (SDPs). In this paper, we extend that result by providing a statistical interpretation of their approximation procedure. Our interpretation will be analogous to ...

Network-regularized Sparse Logistic Regression Models for Clinical Risk Prediction and Biomarker Discovery

Molecular profiling data (e.g., gene expression) have been used for clinical risk prediction and biomarker discovery. However, it is necessary to integrate other prior knowledge, such as biological pathways or gene interaction networks, to improve the predictive ability and biological interpretability of biomarkers. Here, we first introduce a general regularized Logistic Regression (LR) framework with...

Bundle Methods for Machine Learning

We present a globally convergent method for regularized risk minimization problems. Our method applies to Support Vector estimation, regression, Gaussian Processes, and any other regularized risk minimization setting which leads to a convex optimization problem. SVMPerf can be shown to be a special case of our approach. In addition to the unified framework we present tight convergence bounds, w...

Journal title:
  • Computers & Mathematics with Applications

Volume 62, Issue

Pages -

Publication date 2011